Semantic Music Information Retrieval Using Collaborative Indexing and Filtering
Internal identifier: 000090 (Main/Exploration); previous: 000089; next: 000091
Authors: C. H. C. Leung [Hong Kong]; W. S. Chan [Hong Kong]
Source:
- Lecture Notes in Electrical Engineering [1876-1100]; 2010.
Abstract
With the rapid development of multimedia technology, digital music has become increasingly available, and it constitutes a significant component of multimedia content on the Internet. Since digital music can be represented in various forms, formats, and dimensions, searching such information is far more challenging than text-based search. While some basic forms of music retrieval are available on the Internet, these tend to be inflexible and have significant limitations. Currently, most of these music retrieval systems rely only on shallow music information (e.g., metadata, album title, lyrics, etc.). Here, we present an approach for deep content-based music information retrieval that focuses on high-level human perception, incorporating subtle nuances and emotional impressions of the music (e.g., music style, tempo, genre, mood, instrumental combinations, etc.). We also provide a critical evaluation of the most common current Music Information Retrieval (MIR) approaches and propose an innovative adaptive method for music information search that overcomes the current limitations. The main focus of our approach is music discovery and recovery by collaborative semantic indexing and user relevance feedback analysis. Through successive use of our indexing model, novel music content indexes can be built from deep user knowledge, incrementally and collectively, by accumulating users' judgment and intelligence.
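The core idea in the abstract — semantic index entries that accumulate incrementally from many users' relevance judgments — can be illustrated with a minimal sketch. All names, the weighting scheme, and the example terms below are illustrative assumptions for clarity, not the authors' actual model.

```python
# Minimal sketch of collaborative semantic indexing with relevance feedback
# (illustrative assumption, not the paper's actual algorithm): each time a
# user judges a track relevant to a semantic term (e.g. "melancholic"), the
# term-track weight is reinforced; a negative judgment weakens it. Search
# then ranks tracks by the weight accumulated across all users.
from collections import defaultdict

class CollaborativeIndex:
    def __init__(self):
        # term -> track_id -> accumulated weight from user judgments
        self.weights = defaultdict(lambda: defaultdict(float))

    def feedback(self, term, track_id, relevant, step=1.0):
        """Record one user's relevance judgment for a (term, track) pair."""
        self.weights[term][track_id] += step if relevant else -step

    def search(self, term, top_k=5):
        """Rank tracks for a semantic term by accumulated user judgments."""
        ranked = sorted(self.weights[term].items(),
                        key=lambda kv: kv[1], reverse=True)
        return [track for track, weight in ranked[:top_k] if weight > 0]

index = CollaborativeIndex()
index.feedback("melancholic", "clair_de_lune", relevant=True)
index.feedback("melancholic", "clair_de_lune", relevant=True)
index.feedback("melancholic", "golliwogs_cakewalk", relevant=False)
print(index.search("melancholic"))  # ['clair_de_lune']
```

The key property matching the abstract is that the index needs no hand-crafted metadata: semantic terms attach to tracks purely through accumulated collective judgment.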
Url:
DOI: 10.1007/978-90-481-9794-1_65
Affiliations:
Links to previous steps (curation, corpus...)
- to stream Istex, to step Corpus: 000630
- to stream Istex, to step Curation: 000630
- to stream Istex, to step Checkpoint: 000043
- to stream Main, to step Merge: 000090
- to stream Main, to step Curation: 000090
The document in XML format
<record><TEI wicri:istexFullTextTei="biblStruct"><teiHeader><fileDesc><titleStmt><title xml:lang="en">Semantic Music Information Retrieval Using Collaborative Indexing and Filtering</title>
<author><name sortKey="Leung, C H C" sort="Leung, C H C" uniqKey="Leung C" first="C. H. C." last="Leung">C. H. C. Leung</name>
</author>
<author><name sortKey="Chan, W S" sort="Chan, W S" uniqKey="Chan W" first="W. S." last="Chan">W. S. Chan</name>
</author>
</titleStmt>
<publicationStmt><idno type="wicri:source">ISTEX</idno>
<idno type="RBID">ISTEX:B01D5084A433A8FE7C5E49FEF9BD02C389C1232A</idno>
<date when="2011" year="2011">2011</date>
<idno type="doi">10.1007/978-90-481-9794-1_65</idno>
<idno type="url">https://api.istex.fr/document/B01D5084A433A8FE7C5E49FEF9BD02C389C1232A/fulltext/pdf</idno>
<idno type="wicri:Area/Istex/Corpus">000630</idno>
<idno type="wicri:explorRef" wicri:stream="Istex" wicri:step="Corpus" wicri:corpus="ISTEX">000630</idno>
<idno type="wicri:Area/Istex/Curation">000630</idno>
<idno type="wicri:Area/Istex/Checkpoint">000043</idno>
<idno type="wicri:explorRef" wicri:stream="Istex" wicri:step="Checkpoint">000043</idno>
<idno type="wicri:doubleKey">1876-1100:2011:Leung C:semantic:music:information</idno>
<idno type="wicri:Area/Main/Merge">000090</idno>
<idno type="wicri:Area/Main/Curation">000090</idno>
<idno type="wicri:Area/Main/Exploration">000090</idno>
</publicationStmt>
<sourceDesc><biblStruct><analytic><title level="a" type="main" xml:lang="en">Semantic Music Information Retrieval Using Collaborative Indexing and Filtering</title>
<author><name sortKey="Leung, C H C" sort="Leung, C H C" uniqKey="Leung C" first="C. H. C." last="Leung">C. H. C. Leung</name>
<affiliation wicri:level="1"><country xml:lang="fr">Hong Kong</country>
<wicri:regionArea>Hong Kong Baptist University, Kowloon</wicri:regionArea>
<wicri:noRegion>Kowloon</wicri:noRegion>
</affiliation>
<affiliation wicri:level="1"><country wicri:rule="url">Hong Kong</country>
</affiliation>
</author>
<author><name sortKey="Chan, W S" sort="Chan, W S" uniqKey="Chan W" first="W. S." last="Chan">W. S. Chan</name>
<affiliation wicri:level="1"><country xml:lang="fr">Hong Kong</country>
<wicri:regionArea>Hong Kong Baptist University, Kowloon</wicri:regionArea>
<wicri:noRegion>Kowloon</wicri:noRegion>
</affiliation>
</author>
</analytic>
<monogr></monogr>
<series><title level="s">Lecture Notes in Electrical Engineering</title>
<imprint><date>2010</date>
</imprint>
<idno type="ISSN">1876-1100</idno>
<idno type="eISSN">1876-1119</idno>
</series>
</biblStruct>
</sourceDesc>
<seriesStmt><idno type="ISSN">1876-1100</idno>
</seriesStmt>
</fileDesc>
<profileDesc><textClass></textClass>
<langUsage><language ident="en">en</language>
</langUsage>
</profileDesc>
</teiHeader>
<front><div type="abstract" xml:lang="en">Abstract: With the rapid development of multimedia technology, digital music has become increasingly available, and it constitutes a significant component of multimedia content on the Internet. Since digital music can be represented in various forms, formats, and dimensions, searching such information is far more challenging than text-based search. While some basic forms of music retrieval are available on the Internet, these tend to be inflexible and have significant limitations. Currently, most of these music retrieval systems rely only on shallow music information (e.g., metadata, album title, lyrics, etc.). Here, we present an approach for deep content-based music information retrieval that focuses on high-level human perception, incorporating subtle nuances and emotional impressions of the music (e.g., music style, tempo, genre, mood, instrumental combinations, etc.). We also provide a critical evaluation of the most common current Music Information Retrieval (MIR) approaches and propose an innovative adaptive method for music information search that overcomes the current limitations. The main focus of our approach is music discovery and recovery by collaborative semantic indexing and user relevance feedback analysis. Through successive use of our indexing model, novel music content indexes can be built from deep user knowledge, incrementally and collectively, by accumulating users' judgment and intelligence.</div>
</front>
</TEI>
<affiliations><list><country><li>Hong Kong</li>
</country>
</list>
<tree><country name="Hong Kong"><noRegion><name sortKey="Leung, C H C" sort="Leung, C H C" uniqKey="Leung C" first="C. H. C." last="Leung">C. H. C. Leung</name>
</noRegion>
<name sortKey="Chan, W S" sort="Chan, W S" uniqKey="Chan W" first="W. S." last="Chan">W. S. Chan</name>
<name sortKey="Leung, C H C" sort="Leung, C H C" uniqKey="Leung C" first="C. H. C." last="Leung">C. H. C. Leung</name>
</country>
</tree>
</affiliations>
</record>
To manipulate this document under Unix (Dilib)
EXPLOR_STEP=$WICRI_ROOT/Wicri/Musique/explor/DebussyV1/Data/Main/Exploration
HfdSelect -h $EXPLOR_STEP/biblio.hfd -nk 000090 | SxmlIndent | more
Or
HfdSelect -h $EXPLOR_AREA/Data/Main/Exploration/biblio.hfd -nk 000090 | SxmlIndent | more
To link to this page from the Wicri network
{{Explor lien |wiki= Wicri/Musique |area= DebussyV1 |flux= Main |étape= Exploration |type= RBID |clé= ISTEX:B01D5084A433A8FE7C5E49FEF9BD02C389C1232A |texte= Semantic Music Information Retrieval Using Collaborative Indexing and Filtering }}
This area was generated with Dilib version V0.6.33.